
    Recovery And Migration Of Application Logic From Legacy Systems

    Future Internet technologies necessitate dramatic changes in system design, delivery and usage patterns. For many legacy applications this means that their further development and transition to the Internet become problematic or even impossible due to the obsolescence of the technologies they use. Replacement of the old system with a new one, built from scratch, is usually economically unacceptable. Therefore, there is a call for methods and tools supporting the automated migration of legacy systems into a new paradigm. This paper proposes a tool-supported method for recovery and migration of application logic information from legacy systems. The information extracted from a legacy application is stored in the form of precise requirement-level models, enabling automated transformation into a new system structure in a model-driven way. Evaluation of the approach is based on a case study legacy system.
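
    As a rough, hedged illustration of what such a requirement-level model of recovered application logic could look like, the Python sketch below defines a toy scenario structure and builds it from a recovered (actor, action, object) trace. The names UseCaseScenario, ScenarioStep and recover_scenario are illustrative assumptions, not the paper's actual metamodel or tooling.

        from dataclasses import dataclass, field
        from typing import List

        # Illustrative requirement-level model elements (assumed names, not the paper's metamodel).
        @dataclass
        class ScenarioStep:
            actor: str   # e.g. "Clerk" or "System"
            action: str  # verb phrase, e.g. "validates"
            obj: str     # noun phrase, e.g. "invoice data"

        @dataclass
        class UseCaseScenario:
            name: str
            steps: List[ScenarioStep] = field(default_factory=list)

        def recover_scenario(name: str, trace) -> UseCaseScenario:
            """Turn a recovered (actor, action, object) trace into a requirement-level scenario."""
            return UseCaseScenario(name, [ScenarioStep(*t) for t in trace])

        # A trace that a recovery tool might have extracted from legacy code (hypothetical).
        scenario = recover_scenario(
            "Register invoice",
            [("Clerk", "enters", "invoice data"),
             ("System", "validates", "invoice data"),
             ("System", "stores", "invoice")],
        )
        for step in scenario.steps:
            print(f"{step.actor} {step.action} {step.obj}")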

    A data quality management framework to support delivery and consultancy of CRM platforms

    CRM platforms heavily depend on high-quality data; poor-quality data can negatively influence their adoption. Additionally, these platforms are increasingly interconnected and complex in order to meet the growing needs of customers. Hence, the delivery and consultancy of CRM platforms becomes highly complex. In this study, we propose a CRM data quality management framework that supports CRM delivery and consultancy firms in improving data quality management practices within their projects. The framework should also improve data quality within CRM solutions for their clients. We extract best practices for CRM data quality management by means of a literature study on data quality definition and measurement, data quality challenges, and data quality management methods. In a case study at an IT consultancy company, we investigate how CRM delivery and consultancy projects can benefit from the incorporation of data quality management practices. The design of the framework is validated by means of confirmatory focus groups and a questionnaire. The results translate into a framework that provides a high-level overview of data quality management practices incorporated in CRM delivery and consultancy projects. It includes the following components: client profiling, project definition, preparation, migration/integration, data quality definition, assessment, and improvement.
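
    As a hedged sketch of the kind of check the framework's data quality assessment component could support, the Python snippet below computes a simple completeness metric over a few CRM contact records. The record fields and the REQUIRED_FIELDS list are hypothetical examples, not part of the framework itself.

        # Hypothetical CRM contact records; field names are illustrative only.
        contacts = [
            {"name": "Acme B.V.", "email": "info@acme.example", "phone": None},
            {"name": "Globex", "email": None, "phone": "+31 20 000 0000"},
        ]

        REQUIRED_FIELDS = ["name", "email", "phone"]

        def completeness(records, fields):
            """Share of required field values that are actually filled in."""
            total = len(records) * len(fields)
            filled = sum(1 for r in records for f in fields if r.get(f))
            return filled / total if total else 1.0

        print(f"Completeness: {completeness(contacts, REQUIRED_FIELDS):.0%}")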

    From requirements to Java in a snap: model-driven requirements engineering in practice

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL), which allows precision and formality and eventually permits automation of the process of turning requirements into a working system by applying model transformations and code generation.
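
    To make the idea of requirements-to-code transformation concrete, here is a deliberately tiny Python sketch that generates a bare Java class skeleton from a small entity description. It is an illustrative assumption only, not RSL or the transformation chain described in the book.

        # Toy entity description; RSL's actual notions and transformations are far richer.
        domain_entity = {"name": "Invoice", "attributes": {"number": "String", "amount": "double"}}

        def to_java_class(entity):
            """Generate a plain Java class skeleton from a simple entity description."""
            lines = [f"public class {entity['name']} {{"]
            for attr, java_type in entity["attributes"].items():
                lines.append(f"    private {java_type} {attr};")
            lines.append("}")
            return "\n".join(lines)

        print(to_java_class(domain_entity))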

    Enterprise Information Systems

    International audience

    Teaching Software Modeling in Computing Curricula

    Modeling is a key skill in software development. The ability to develop, manipulate, and understand models for software is therefore an important learning objective in many CS/SE courses. In this working group, we investigated how and when (software) modeling is taught, to help us better understand the key issues in teaching (software) modeling. Several shortcomings were found in common curricula, both in their understanding of the term "modeling" and in how they address its teaching. This WG report summarizes the findings and formulates recommendations on the inclusion of software modeling courses in future CS/SE curricula.

    An Architecture Principle Measurement Instrument Tested in Real-Life

    A high percentage of information system projects still fail due to poor implementation of requirements. Over the years, investigations by numerous scientists suggest that architecture principles are important in the successful implementation of those information system requirements. However, these investigations are of a theoretical nature; until now, no validation in practice has taken place. Our research addresses this empirical validation: do architecture principles work in real-life situations? To find this evidence, we need an instrument to measure architecture principles, in order to establish the connection between principle and project success. The focus of this paper is such an architecture principle measurement instrument. We describe the results of a literature study, yielding both the definition and the characteristics of the architecture principle. Besides the measurement instrument, we describe the related measurement method, including its test in a real-life case. Based on the outcome of the case study, we extend the instrument with additional architecture principle characteristics and attributes, and we improve the measurement method.

    Entity resolution in large patent databases: An optimization approach

    Entity resolution in databases focuses on detecting and merging entities that refer to the same real-world object. Collective resolution is among the most prominent mechanisms suggested to address this challenge, since the resolution decisions are not made independently, but are based on the available relationships within the data. In this paper, we introduce a novel resolution approach that combines the essence of collective resolution with rules and transformations among entity attributes and values. We illustrate how the approach’s parameters are optimized based on a global optimization algorithm, i.e., simulated annealing, and explain how this optimization is performed using a small training set. The quality of the approach is verified through an extensive experimental evaluation with 40M real-world scientific entities from the Patstat database.
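
    A minimal sketch of the optimization idea, assuming a single matching-threshold parameter tuned by simulated annealing against a tiny labelled training set. The similarity function, the toy training pairs and the annealing schedule are illustrative assumptions, not the paper's actual parameterization or the Patstat data.

        import math
        import random
        from difflib import SequenceMatcher

        # Tiny labelled training pairs (name_a, name_b, same_entity); toy data, not Patstat.
        TRAIN = [
            ("Siemens AG", "SIEMENS AKTIENGESELLSCHAFT", True),
            ("Siemens AG", "Philips N.V.", False),
            ("Philips", "Koninklijke Philips N.V.", True),
            ("IBM Corp.", "Nokia Oyj", False),
        ]

        def similarity(a, b):
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def accuracy(threshold):
            """Fraction of training pairs classified correctly at this matching threshold."""
            hits = sum((similarity(a, b) >= threshold) == label for a, b, label in TRAIN)
            return hits / len(TRAIN)

        def anneal(steps=1000, temp=1.0, cooling=0.995):
            """Simulated annealing over a single matching-threshold parameter."""
            current = best = random.random()
            for _ in range(steps):
                candidate = min(1.0, max(0.0, current + random.gauss(0, 0.05)))
                delta = accuracy(candidate) - accuracy(current)
                # Always accept improvements; accept worse moves with a temperature-dependent probability.
                if delta >= 0 or random.random() < math.exp(delta / temp):
                    current = candidate
                    if accuracy(current) > accuracy(best):
                        best = current
                temp *= cooling
            return best

        print("tuned threshold:", round(anneal(), 2))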

    IT Project Portfolio Management: Development and Validation of a Reference Model

    IT Project Portfolio Management has been implemented in most organizations to effectively manage complex portfolios of IT projects and balance them with business strategy. Several standards for portfolio management have been published, but the scientific literature still lacks a theoretically grounded and practically validated reference model for analyzing the implementation of IT Project Portfolio Management in an organization. Therefore, this study designs and validates a reference model for systematically analyzing IT Project Portfolio Management design choices in an organization in terms of processes, roles, responsibilities, and authority. Organizations can use the reference model to systematically assess their local implementation of IT Project Portfolio Management and identify areas for improvement.

    Constraint Formalization for Automated Assessment of Enterprise Models

    Enterprises always do their business within certain restrictions. In a team of enterprise architects, these restrictions are transformed into modelling conventions and corresponding modelling constraints that should be applied consistently across all enterprise models. This paper presents an approach for refining and formalizing modelling conventions into modelling constraints and using them for the assessment of enterprise models by a software component called ArchiChecker. The specific feature of the proposed approach is that the modelling conventions are first visualized and formalized using the types of elements and relationships of the ArchiMate modelling language, which is also used for modelling the enterprise views. The ArchiMate elements and relationships serve as types for formulating constraints, while the elements and relationships in an ArchiMate model are instances of these types. Using these types and instances, the ArchiChecker automatically generates lists of violations of modelling conventions in the enterprise models. Each violation shows how a specific enterprise view deviates from a given modelling convention. The paper reports a case study applying the proposed approach to the enterprise modelling views and modelling conventions used in a medical center. The case study is used to discuss the added value of the formalization and automated assessment of modelling constraints in enterprise modelling.
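
    A minimal sketch of the constraint-checking idea, assuming typed elements and allowed (source type, relation, target type) patterns as formalized conventions. The element names, the ALLOWED set and the violations function are illustrative assumptions, not the ArchiChecker implementation or the full ArchiMate metamodel.

        # Toy element and relationship instances; real ArchiMate models are far richer.
        elements = {
            "Register patient": "BusinessProcess",
            "EPR system": "ApplicationComponent",
            "Patient record": "BusinessObject",
        }
        relationships = [
            ("EPR system", "serving", "Register patient"),
            ("Register patient", "serving", "Patient record"),  # breaks the convention below
        ]

        # Modelling conventions formalized as allowed (source type, relation, target type) patterns.
        ALLOWED = {
            ("ApplicationComponent", "serving", "BusinessProcess"),
            ("BusinessProcess", "access", "BusinessObject"),
        }

        def violations(elements, relationships, allowed):
            """List relationships whose typed pattern is not permitted by the conventions."""
            found = []
            for src, rel, dst in relationships:
                pattern = (elements[src], rel, elements[dst])
                if pattern not in allowed:
                    found.append(f"{src} --{rel}--> {dst} violates the conventions (pattern {pattern})")
            return found

        for v in violations(elements, relationships, ALLOWED):
            print(v)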

    Data Governance and Information Governance: Set of Definitions in Relation to Data and Information as Part of DIKW

    Chaos emerges with the ever-growing amounts of data and information within organisations. But it is problematic to manage these valuable assets, and to remain accountable and compliant for them, because there is no agreement about even their definitions. Our objective is to propose a coherent set of definitions for data governance and information governance within and across organisations, in relation to data and information as underlying concepts. As a research method, we explore elements from existing definitions in the literature about the Data-Information-Knowledge-Wisdom pyramid and about data governance and information governance. Classification of these elements and coding them into concepts during discussions among peers resulted in a new vocabulary. This forms the basis for the formulation and design of an original, coherent set of definitions for data, information, meaning, data governance and information governance. This research is grounded, goal-oriented and uses multiple accepted literature review methods, but it is limited to the literature found and to the IS domain.